Parallel Subgradient Method for Nonsmooth Convex Optimization with a Simple Constraint

Author

  • KAZUHIRO HISHINUMA

Abstract

In this paper, we consider the problem of minimizing the sum of nondifferentiable, convex functions over a closed convex set in a real Hilbert space, where the set is simple in the sense that the projection onto it can be computed easily. We present a parallel subgradient method for solving this problem, together with two convergence analyses of the method. One analysis shows that the parallel method with a small constant step size approximates a solution to the problem. The other shows that the parallel method with a diminishing step size converges to a solution in the sense of the weak topology of the Hilbert space. Finally, we numerically compare our method with an existing method and discuss future work on parallel subgradient methods.
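The abstract describes the two ingredients the method combines: subgradient steps for each nonsmooth component, and a cheap projection onto the simple constraint set. The following is a minimal sketch of one plausible parallel projected-subgradient scheme under these assumptions; the averaging update and step-size rule here are illustrative, not the authors' exact method.

```python
import numpy as np

def parallel_subgradient(subgrads, project, x0, step, iters=1000):
    """Hedged sketch: each component i takes its own projected
    subgradient step from the current point (these steps are
    independent and could run on separate processors), and the
    results are averaged into the next iterate."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        alpha = step(k)  # constant or diminishing step size
        # one projected subgradient step per component, in parallel
        ys = [project(x - alpha * g(x)) for g in subgrads]
        x = np.mean(ys, axis=0)  # combine the parallel results
    return x

# Toy instance (assumed for illustration): minimize sum_i |x - a_i|
# over the simple set C = [0, 1]; the minimizer is the median, 0.5.
a = [0.2, 0.5, 0.9]
subgrads = [lambda x, ai=ai: np.sign(x - ai) for ai in a]  # subgradient of |x - a_i|
project = lambda x: np.clip(x, 0.0, 1.0)  # projection onto the box
x_star = parallel_subgradient(subgrads, project,
                              x0=np.array([0.0]),
                              step=lambda k: 1.0 / (k + 1))  # diminishing step
```

With the diminishing step size the iterate settles at the median 0.5, matching the abstract's claim that a diminishing step size yields convergence to a solution, whereas a small constant step size would only hover near it.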


Similar articles

On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions

Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...


An Optimal Subgradient Algorithm for Large-scale Convex Optimization in Simple Domains

This paper shows that the optimal subgradient algorithm, OSGA, proposed in [59], can be used for solving structured large-scale convex constrained optimization problems. Only first-order information is required, and the optimal complexity bounds for both smooth and nonsmooth problems are attained. More specifically, we consider two classes of problems: (i) a convex objective with a simple closed ...


Randomized Block Subgradient Methods for Convex Nonsmooth and Stochastic Optimization

Block coordinate descent methods and stochastic subgradient methods have been extensively studied in optimization and machine learning. By combining randomized block sampling with stochastic subgradient methods based on dual averaging ([22, 36]), we present stochastic block dual averaging (SBDA)—a novel class of block subgradient methods for convex nonsmooth and stochastic optimization. SBDA re...


Proximal point algorithms for nonsmooth convex optimization with fixed point constraints

This work considers the problem of minimizing the sum of nonsmooth, convex objective functions defined on a real Hilbert space over the intersection of the fixed point sets of nonexpansive mappings, onto which the projections cannot be computed efficiently. It proposes proximal point algorithms that use the proximity operators of the objective functions together with incremental optimization techniques...
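The blurb above names two ingredients: proximity operators of the component functions and an incremental pass over them, with a nonexpansive mapping standing in for the hard-to-project constraint. A minimal sketch under those assumptions (the paper's exact update rule may differ):

```python
def prox_shifted_abs(c):
    """Proximity operator of lam * |x - c| (scalar soft-thresholding,
    shifted so its minimizer sits at c)."""
    def prox(v, lam):
        if v < c - lam:
            return v + lam
        if v > c + lam:
            return v - lam
        return c
    return prox

def incremental_proximal(proxes, T, x0, step, iters=500):
    """Hedged sketch: cycle incrementally through the proximity
    operators of the components, then apply the nonexpansive mapping T
    whose fixed point set encodes the constraint."""
    x = float(x0)
    for k in range(iters):
        lam = step(k)  # diminishing proximal parameter
        for prox in proxes:  # incremental pass over the components
            x = prox(x, lam)
        x = T(x)  # push the iterate toward Fix(T)
    return x

# Toy instance (assumed): minimize |x - 3| + |x - 3| over Fix(T) = [0, 1],
# where T is the (nonexpansive) projection onto [0, 1]; the solution is 1.
x_sol = incremental_proximal(
    proxes=[prox_shifted_abs(3.0), prox_shifted_abs(3.0)],
    T=lambda x: min(max(x, 0.0), 1.0),
    x0=0.0,
    step=lambda k: 1.0 / (k + 1),
)
```

The projection here is only a stand-in so the sketch runs; the point of the cited approach is precisely that it avoids projections onto the true constraint set by using a cheaper nonexpansive mapping with the same fixed points.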


Solving generation expansion planning problems with environmental constraints by a bundle method

We discuss the energy generation expansion planning with environmental constraints, formulated as a nonsmooth convex constrained optimization problem. To solve such problems, methods suitable for constrained nonsmooth optimization need to be employed. We describe a recently developed approach, which applies the usual unconstrained bundle techniques to a dynamically changing “improvement functio...




Journal:

Volume   Issue

Pages  -

Publication date: 2015